-
Kumar, Amit; Ron-Zewi, Noga (Eds.)

Let G = (V, E) be a graph on n vertices and let m^*(G) denote the size of a maximum matching in G. We show that for any δ > 0 and for any 1 ≤ k ≤ (1-δ)m^*(G), the down-up walk on matchings of size k in G mixes in time polynomial in n. Previously, polynomial mixing was not known even for graphs of bounded maximum degree Δ, and our result makes progress on a conjecture of Jain, Perkins, Sah, and Sawhney [STOC, 2022] that the down-up walk mixes in the optimal time O_{Δ,δ}(n log n). In contrast with recent works analyzing mixing of down-up walks in various settings using the spectral independence framework, we bound the spectral gap by constructing and analyzing a suitable multi-commodity flow. In fact, we present constructions demonstrating the limitations of the spectral independence approach in our setting.
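For readers less familiar with the chain being analyzed, the following is a minimal Python sketch of one step of the down-up walk on size-k matchings, whose stationary distribution is uniform over size-k matchings; the graph representation (a list of frozenset edges) is an illustrative choice, not code from the paper.

```python
import random

def down_up_step(edges, matching):
    """One step of the down-up walk on matchings of a fixed size k.

    edges:    list of edges of G, each a frozenset of two vertices.
    matching: current matching, a set of k edges (frozensets).
    Returns a new matching of size k; iterating this step samples from the
    uniform distribution over size-k matchings of G.
    """
    current = set(matching)
    # Down step: delete a uniformly random edge, leaving a matching of size k-1.
    removed = random.choice(list(current))
    current.remove(removed)
    # Up step: among all edges of G whose endpoints are both unmatched in the
    # size-(k-1) matching (this always includes the deleted edge), add one
    # uniformly at random, returning to a matching of size k.
    matched = {v for e in current for v in e}
    candidates = [e for e in edges if not (e & matched)]
    current.add(random.choice(candidates))
    return current

# Example: a 4-cycle, walking on matchings of size k = 1.
edges = [frozenset(p) for p in [(0, 1), (1, 2), (2, 3), (3, 0)]]
state = {frozenset((0, 1))}
for _ in range(100):
    state = down_up_step(edges, state)
```

The result above concerns how quickly repeated application of this step converges, i.e., that the chain mixes in time polynomial in n whenever k ≤ (1-δ)m^*(G).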
-
Many problems in combinatorial linear algebra require upper bounds on the number of solutions to an underdetermined system of linear equations $Ax = b$, where the coordinates of the vector $x$ are restricted to take values in some small subset (e.g. $\{\pm 1\}$) of the underlying field. The classical ways of bounding this quantity are to use either a rank bound observation due to Odlyzko or a vector anti-concentration inequality due to Halász. The former gives a stronger conclusion except when the number of equations is significantly smaller than the number of variables; even in such situations, the hypotheses of Halász’s inequality are quite hard to verify in practice. In this paper, using a novel approach to the anti-concentration problem for vector sums, we obtain new Halász-type inequalities that beat the Odlyzko bound even in settings where the number of equations is comparable to the number of variables. In addition to being stronger, our inequalities have hypotheses that are considerably easier to verify. We present two applications of our inequalities to combinatorial (random) matrix theory: (i) we obtain the first non-trivial upper bound on the number of $n \times n$ Hadamard matrices and (ii) we improve a recent bound of Deneanu and Vu on the probability of normality of a random $\{\pm 1\}$ matrix.
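For orientation, the Odlyzko rank bound referred to above can be stated as follows; this is the standard formulation of the lemma rather than text from the paper:

```latex
% If Ax = b has any solution at all, its real solution set is an affine
% subspace of dimension d = n - rank(A), and an affine subspace of dimension d
% contains at most 2^d points of the cube {-1,+1}^n (Odlyzko's lemma). Hence
\[
  \#\bigl\{\, x \in \{\pm 1\}^n : Ax = b \,\bigr\}
    \;\le\; 2^{\,n - \operatorname{rank}(A)} .
\]
```

The Halász-type inequalities of the paper aim to beat this $2^{n - \operatorname{rank}(A)}$ benchmark even when the number of equations is comparable to $n$.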
-
We introduce a notion called entropic independence that is an entropic analog of spectral notions of high-dimensional expansion. Informally, entropic independence of a background distribution $\mu$ on $k$-sized subsets of a ground set of elements says that for any (possibly randomly chosen) set $S$, the relative entropy of a single element of $S$ drawn uniformly at random carries at most an $O(1/k)$ fraction of the relative entropy of $S$. Entropic independence is the analog of the notion of spectral independence, if one replaces variance by entropy. We use entropic independence to derive tight mixing time bounds, overcoming the lossy nature of spectral analysis of Markov chains on exponential-sized state spaces. In our main technical result, we show a general way of deriving entropy contraction, a.k.a. modified log-Sobolev inequalities, for down-up random walks from spectral notions. We show that spectral independence of a distribution under arbitrary external fields automatically implies entropic independence. We furthermore extend our theory to the case where spectral independence does not hold under arbitrary external fields. To do this, we introduce a framework for obtaining tight mixing time bounds for Markov chains based on what we call restricted modified log-Sobolev inequalities, which guarantee entropy contraction not for all distributions, but for those in a sufficiently large neighborhood of the stationary distribution. To derive our results, we relate entropic independence to properties of polynomials: $\mu$ is entropically independent exactly when a transformed version of the generating polynomial of $\mu$ is upper bounded by its linear tangent; this property is implied by concavity of the said transformation, which was shown by prior work to be locally equivalent to spectral independence. We apply our results to obtain (1) tight modified log-Sobolev inequalities and mixing times for multi-step down-up walks on fractionally log-concave distributions, (2) the tight mixing time of $O(n \log n)$ for Glauber dynamics on Ising models whose interaction matrix has eigenspectrum lying within an interval of length smaller than $1$, improving upon the prior quadratic dependence on $n$, and (3) nearly-linear time $\widetilde O_{\delta}(n)$ samplers for the hardcore and Ising models on $n$-node graphs that have $\delta$-relative gap to the tree-uniqueness threshold. In the last application, our bound on the running time does not depend on the maximum degree $\Delta$ of the graph, and is therefore optimal even for high-degree graphs; in fact, it is sublinear in the size of the graph for high-degree graphs.
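As a concrete reference point for application (2), here is a minimal Python sketch of one step of Glauber dynamics for an Ising model; the data structures (dicts for neighbor lists, interactions $J$, and fields $h$) are illustrative assumptions, not code from the paper.

```python
import math
import random

def glauber_step(neighbors, J, h, sigma):
    """One step of Glauber dynamics for an Ising model with distribution
    P(sigma) proportional to
    exp(sum_{uv} J[uv]*sigma[u]*sigma[v] + sum_v h[v]*sigma[v]).

    neighbors: dict mapping each vertex to an iterable of its neighbors.
    J:         dict mapping frozenset({u, v}) to the interaction on edge uv.
    h:         dict mapping each vertex to its external field.
    sigma:     current configuration, dict mapping each vertex to +1 or -1.
    """
    v = random.choice(list(sigma))
    # Local field at v: external field plus contributions from neighboring spins.
    field = h.get(v, 0.0) + sum(J[frozenset((v, u))] * sigma[u]
                                for u in neighbors[v])
    # Resample sigma[v] from its conditional distribution given the other spins.
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
    sigma[v] = 1 if random.random() < p_plus else -1
    return sigma
```

Result (2) of the abstract concerns how many such single-site updates are needed, namely $O(n \log n)$, when the eigenvalues of the interaction matrix lie in an interval of length smaller than $1$.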